MAP Estimation, Message Passing, and Perfect Graphs
Abstract
Efficiently finding the maximum a posteriori (MAP) configuration of a graphical model is an important problem that is often addressed with message passing algorithms. The optimality of such algorithms is only well established for singly-connected graphs and other limited settings. This article extends the set of graphs for which MAP estimation is in P, and for which message passing recovers the exact solution, to so-called perfect graphs. The result leverages recent progress in characterizing perfect graphs (the strong perfect graph theorem), linear programming relaxations of MAP estimation, and recent convergent message passing schemes. The article converts graphical models into nand Markov random fields, which are straightforward to relax into linear programs. Therein, integrality can be established in general by testing the graph for perfection. This perfection test can be performed with a polynomial time algorithm. Alternatively, known decomposition tools from perfect graph theory may be used to prove perfection for certain families of graphs. Thus, a general framework is provided for determining when MAP estimation in any graphical model is in P, has an integral linear programming relaxation, and is exactly recoverable by message passing.
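As a concrete (toy) walk-through of the pipeline described in the abstract, and not the paper's exact construction, the sketch below builds a small nand Markov random field for a chain model, solves the clique-constrained linear programming relaxation of maximum-weight stable set, and checks that the optimum is integral and matches brute-force MAP. The model, potentials, and node naming are illustrative choices, and it assumes numpy, scipy, and networkx are installed.

```python
# Toy illustration of: graphical model -> nand MRF -> LP relaxation -> integrality check.
import itertools
import numpy as np
import networkx as nx
from scipy.optimize import linprog

# Toy chain x0 - x1 - x2 with binary states and (positive) pairwise potentials,
# so a maximum-weight stable set always selects one configuration per edge.
theta = {
    (0, 1): np.array([[1.0, 2.0], [3.0, 0.5]]),   # theta_{01}(x0, x1)
    (1, 2): np.array([[2.5, 0.7], [1.2, 2.0]]),   # theta_{12}(x1, x2)
}

# NMRF: one node per (edge, configuration); nand edges join configurations that
# cannot co-occur in a single joint assignment.
G = nx.Graph()
for (i, j), table in theta.items():
    for a, b in itertools.product(range(2), repeat=2):
        G.add_node((i, j, a, b), weight=float(table[a, b]))
for u, v in itertools.combinations(list(G.nodes), 2):
    (iu, ju, au, bu), (iv, jv, av, bv) = u, v
    if (iu, ju) == (iv, jv):
        G.add_edge(u, v)                          # two configurations of the same edge
        continue
    for s in set((iu, ju)) & set((iv, jv)):       # shared variable with conflicting value
        if (au if s == iu else bu) != (av if s == iv else bv):
            G.add_edge(u, v)

# LP relaxation of maximum-weight stable set with maximal-clique inequalities:
#   max sum_i w_i z_i  s.t.  sum_{i in C} z_i <= 1 for every maximal clique C, 0 <= z <= 1.
nodes = list(G.nodes)
idx = {node: col for col, node in enumerate(nodes)}
w = np.array([G.nodes[node]["weight"] for node in nodes])
cliques = list(nx.find_cliques(G))
A = np.zeros((len(cliques), len(nodes)))
for row, clique in enumerate(cliques):
    for node in clique:
        A[row, idx[node]] = 1.0
res = linprog(-w, A_ub=A, b_ub=np.ones(len(cliques)), bounds=[(0, 1)] * len(nodes))
z = res.x

# Brute-force MAP value on the original chain, for comparison.
map_value = max(sum(theta[e][x[e[0]], x[e[1]]] for e in theta)
                for x in itertools.product(range(2), repeat=3))
print("LP value:", -res.fun, "integral:", bool(np.allclose(z, np.round(z))))
print("Brute-force MAP value:", map_value)
```

For this particular toy chain the conflict graph is perfect (its complement is bipartite), so the clique-constrained relaxation is tight and the LP optimum coincides with the brute-force MAP value; certifying this kind of tightness in general is what the polynomial time perfection test mentioned in the abstract is for.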
Similar Articles
Revisiting MAP Estimation, Message Passing and Perfect Graphs
Given a graphical model, one of the most useful queries is to find the most likely configuration of its variables. This task, known as the maximum a posteriori (MAP) problem, can be solved efficiently via message passing techniques when the graph is a tree, but is NP-hard for general graphs. Jebara (2009) shows that the MAP problem can be converted into the stable set problem, which can be solved...
Perfect graphs and MAP estimation
Graphical models have become an indispensable tool in machine learning and applied statistics for representing networks of variables and probability distributions describing their interactions. Recovering the maximum a posteriori (MAP) configuration of random variables in a graphical model is an important problem with applications ranging from protein folding to image processing. The task of fi...
Sparse Channel Estimation by Factor Graphs
The problem of estimating a sparse channel, i.e., a channel with only a few non-zero taps, appears in various areas of communications. Recently, we have developed an algorithm based on iterative alternating minimization which iteratively detects the location and the value of the taps. This algorithm involves an approximate Maximum A Posteriori (MAP) probability scheme for detection of the location o...
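As a generic sketch of the alternating idea this blurb describes, and not the authors' actual algorithm, the snippet below alternates a support-detection step with a least-squares value step (essentially hard thresholding pursuit). The pilot matrix X, the dimensions, and the sparsity level k are invented for the demo; it assumes only numpy.

```python
# Alternating minimization sketch for sparse channel estimation: detect tap
# locations, then estimate tap values on the detected support, and repeat.
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 64, 32, 4                                  # taps, pilot measurements, nonzero taps
X = rng.standard_normal((m, n)) / np.sqrt(m)         # known pilot/measurement matrix
h_true = np.zeros(n)
true_support = rng.choice(n, size=k, replace=False)
h_true[true_support] = rng.standard_normal(k)
y = X @ h_true + 0.01 * rng.standard_normal(m)       # noisy observations

h = np.zeros(n)
for _ in range(20):
    g = h + X.T @ (y - X @ h)                        # detection step: gradient update
    support = np.argsort(np.abs(g))[-k:]             # keep the k most promising tap locations
    vals, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
    h = np.zeros(n)
    h[support] = vals                                # value step: least squares on the support

print("true support     :", sorted(true_support))
print("estimated support:", sorted(support))
```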
Maximum a Posteriori Estimation of Dynamically Changing Distributions
This paper presents a sequential state estimation method with arbitrary probabilistic models expressing the system's belief. Probabilistic models can be estimated by maximum a posteriori (MAP) estimators, which fail if the state is dynamic or the model contains hidden variables. The latter typically requires iterative methods like expectation maximization (EM). The proposed approximative techniq...
MMSE denoising of sparse Lévy processes via message passing
Many recent algorithms for sparse signal recovery can be interpreted as maximum-a-posteriori (MAP) estimators relying on some specific priors. From this Bayesian perspective, state-of-the-art methods based on discrete-gradient regularizers, such as total-variation (TV) minimization, implicitly assume the signals to be sampled instances of Lévy processes with independent Laplace-distributed incr...
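The MAP reading mentioned in this blurb can be made concrete with a small sketch: under Gaussian noise and an i.i.d. Laplace prior on the increments x[i] - x[i-1], the negative log-posterior is a quadratic data term plus an L1 penalty on the discrete gradient, i.e. total-variation denoising. This is not the paper's MMSE message-passing scheme; the signal, noise level, and prior scale lam are arbitrary demo choices, and it assumes numpy and cvxpy are installed.

```python
# MAP denoising with a Laplace prior on increments == TV-regularized least squares.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
x_true = np.repeat([0.0, 2.0, -1.0, 1.0], 50)        # piecewise-constant test signal
y = x_true + 0.3 * rng.standard_normal(x_true.size)  # noisy observations

sigma2, lam = 0.3 ** 2, 2.0                          # noise variance and Laplace prior scale
x = cp.Variable(y.size)
neg_log_posterior = cp.sum_squares(y - x) / (2 * sigma2) + lam * cp.norm1(cp.diff(x))
cp.Problem(cp.Minimize(neg_log_posterior)).solve()

print("MSE of noisy input  :", np.mean((y - x_true) ** 2))
print("MSE of MAP/TV output:", np.mean((x.value - x_true) ** 2))
```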